Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise
Authors
Behtash Babadi, Demba Ba, Patrick L. Purdon, and Emery N. Brown
Abstract
In this paper, we study the theoretical properties of a class of iteratively re-weighted least squares (IRLS) algorithms for sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between this class of algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution. The IRLS algorithms we consider are parametrized by 0 < ν ≤ 1 and ε > 0. The EM formalism, as well as the connection to GSMs, allows us to establish that the IRLS(ν, ε) algorithms minimize ε-smooth versions of the ℓν 'norms'. We leverage EM theory to show that, for each 0 < ν ≤ 1, the limit points of the sequence of IRLS(ν, ε) iterates are stationary points of the ε-smooth ℓν 'norm' minimization problem on the constraint set. Finally, we employ techniques from compressive sampling (CS) theory to show that the class of IRLS(ν, ε) algorithms is stable for each 0 < ν ≤ 1, provided that the limit point of the iterates coincides with the global minimizer. For the case ν = 1, we show that the algorithm converges exponentially fast to a neighborhood of the stationary point, and we outline its generalization to super-exponential convergence for ν < 1. We demonstrate our claims via simulation experiments. The simplicity of IRLS, along with the theoretical guarantees provided in this contribution, makes a compelling case for its adoption as a standard tool for sparse signal recovery.
Affiliations: (1) Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139; (2) Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114; (3) Harvard-MIT Division of Health Sciences and Technology.
Emails: Behtash Babadi ([email protected]), Demba Ba ([email protected]), Patrick L. Purdon ([email protected]), and Emery N. Brown ([email protected])
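To make the iteration concrete, the following is a minimal Python sketch of an IRLS(ν, ε) scheme of the kind analyzed above; it is not the authors' implementation. The initialization, the ridge parameter lam, and the fixed iteration count are placeholder choices, and the weight rule w_i = (x_i^2 + ε)^(ν/2 − 1) is the standard one for the ε-smooth ℓν 'norm'.

import numpy as np

def irls(A, y, nu=1.0, eps=1e-6, lam=1e-3, n_iter=100):
    # Sketch of IRLS(nu, eps) for recovering a sparse x from noisy y ~ A x.
    # Each pass solves a weighted least-squares problem whose weights come
    # from the current iterate -- the M-step of the equivalent EM algorithm
    # under a Gaussian scale mixture prior.
    m, n = A.shape
    x = A.T @ y  # crude initialization (placeholder choice)
    for _ in range(n_iter):
        # Weights of the eps-smooth l_nu 'norm': w_i = (x_i^2 + eps)^(nu/2 - 1).
        w = (x ** 2 + eps) ** (nu / 2.0 - 1.0)
        # Closed-form update of x = argmin ||y - A x||^2 + lam * sum_i w_i x_i^2,
        # written via the matrix inversion lemma so only an m-by-m solve is needed.
        Winv = np.diag(1.0 / w)
        x = Winv @ A.T @ np.linalg.solve(A @ Winv @ A.T + lam * np.eye(m), y)
    return x

# Illustrative use: recover a 5-sparse vector from 40 noisy random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = irls(A, y, nu=1.0)

For ν = 1 this iteration targets the ε-smooth ℓ1 objective; taking ν < 1 sharpens the penalty toward ℓ0, at the price of a nonconvex landscape in which only stationarity, not global optimality, is guaranteed.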
Similar Articles
Convergence and Stability of a Class of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise.
In this paper, we study the theoretical properties of a class of iteratively re-weighted least squares (IRLS) algorithms for sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between this class of algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribu...
[Proceeding] Fast and Robust EM-Based IRLS Algorithm for Sparse Signal Recovery from Noisy Measurements
In this paper, we analyze a new class of iterative re-weighted least squares (IRLS) algorithms and their effectiveness in signal recovery from incomplete and inaccurate linear measurements. These methods can be interpreted as constrained maximum likelihood estimation under a two-state Gaussian scale mixture assumption on the signal. We show that this class of algorithms, which performs exac...
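For context, a two-state Gaussian scale mixture prior models each signal coefficient as drawn from one of two zero-mean Gaussians; the parametrization below is a generic one, not necessarily that paper's exact form:

\[
p(x_i) = \alpha \, \mathcal{N}(x_i; 0, \sigma_1^2) + (1 - \alpha) \, \mathcal{N}(x_i; 0, \sigma_0^2), \qquad \sigma_0^2 \ll \sigma_1^2,
\]

so a small mixing weight α favors coefficients near zero; roughly speaking, the E-step of the associated EM algorithm produces the re-weighting and the M-step is a weighted least-squares solve.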
Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed ℓq Minimization
In this paper, we first study ℓq minimization and its associated iterative reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on unconstrained ℓq minimization, for which we demonstrate several advantages for noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., Comm. Pure Appl. Math., 63 (2010), pp. 1–38] for constrained ...
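Up to notation, the unconstrained smoothed ℓq problem referred to here has the form below (λ is a trade-off weight and ε > 0 a smoothing parameter; the symbols are ours, chosen to match the main abstract):

\[
\min_{x \in \mathbb{R}^n} \; \lambda \sum_{i=1}^{n} \left( x_i^2 + \epsilon^2 \right)^{q/2} + \tfrac{1}{2} \lVert A x - b \rVert_2^2, \qquad 0 < q \le 1,
\]

which IRLS attacks by repeatedly solving weighted least-squares problems with weights w_i = (x_i^2 + ε^2)^{q/2 − 1}.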
Frames for compressed sensing using coherence
We give some new results on sparse signal recovery in the presence of noise for weighted spaces. Traditionally, dictionaries whose elements have unit norm were used, but for random dictionaries this condition is rarely satisfied. Moreover, we give better estimates than the ones given recently by Cai, Wang, and Xu.
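The coherence in question is the standard mutual coherence of a dictionary A with columns a_i; for columns that are not unit-norm, the case this snippet emphasizes, it is the normalized quantity

\[
\mu(A) = \max_{i \neq j} \frac{\lvert \langle a_i, a_j \rangle \rvert}{\lVert a_i \rVert_2 \, \lVert a_j \rVert_2},
\]

and smaller μ(A) yields stronger recovery guarantees.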
A comparison of typical ℓp minimization algorithms
Recently, compressed sensing has been widely applied to various areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector w.r.t. a dictionary, an ℓ1 minimization problem, which is convex, is usually solved in order to overcome the computational difficulty. However, to guarantee that the ℓ1 minimizer is close to the sparsest solutio...
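The convex surrogate referred to here is the basis pursuit problem

\[
\min_{x \in \mathbb{R}^n} \lVert x \rVert_1 \quad \text{subject to} \quad A x = b,
\]

with the equality constraint relaxed to \(\lVert A x - b \rVert_2 \le \delta\) when the measurements are noisy.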